FakeNews: GAN-based generation of realistic 3D volumetric data -- A systematic review and taxonomy

arXiv.org Artificial Intelligence

With the massive proliferation of data-driven algorithms, such as deep learning-based approaches, the availability of high-quality data is of great interest. Volumetric data is especially important in medicine, with applications ranging from disease diagnosis to therapy monitoring. When sufficient data is available, models can be trained to help doctors with these tasks. Unfortunately, there are scenarios where large amounts of data are unavailable. For example, rare diseases and privacy issues can lead to restricted data availability. In non-medical fields, the high cost of obtaining enough high-quality data can also be a concern. A solution to these problems can be the generation of realistic synthetic data using Generative Adversarial Networks (GANs). Such mechanisms are a valuable asset, especially in healthcare, where the data must be of good quality, realistic, and free of privacy issues. Consequently, most publications on volumetric GANs are within the medical domain. In this review, we provide a summary of works that generate realistic volumetric synthetic data using GANs. We outline GAN-based methods in these areas, covering common architectures, loss functions, and evaluation metrics, including their advantages and disadvantages. We present a novel taxonomy, evaluations, challenges, and research opportunities to provide a holistic overview of the current state of volumetric GANs.
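To make the class of methods the review surveys more concrete, the sketch below shows a minimal DCGAN-style 3D GAN in PyTorch: a generator built from transposed 3D convolutions, a 3D convolutional discriminator, and one training step with the standard non-saturating adversarial loss. This is an illustrative assumption, not the architecture of any specific reviewed paper; the 32x32x32 resolution, channel widths, and hyperparameters are placeholders.

```python
# Minimal 3D GAN sketch (illustrative only; layer sizes and losses are assumptions).
import torch
import torch.nn as nn

class Generator3D(nn.Module):
    """Maps a latent vector to a 1-channel 32x32x32 volume."""
    def __init__(self, latent_dim=128):
        super().__init__()
        self.net = nn.Sequential(
            nn.ConvTranspose3d(latent_dim, 256, 4, 1, 0), nn.BatchNorm3d(256), nn.ReLU(True),  # -> 4^3
            nn.ConvTranspose3d(256, 128, 4, 2, 1), nn.BatchNorm3d(128), nn.ReLU(True),         # -> 8^3
            nn.ConvTranspose3d(128, 64, 4, 2, 1), nn.BatchNorm3d(64), nn.ReLU(True),           # -> 16^3
            nn.ConvTranspose3d(64, 1, 4, 2, 1), nn.Tanh(),                                     # -> 32^3
        )

    def forward(self, z):
        return self.net(z.view(z.size(0), -1, 1, 1, 1))

class Discriminator3D(nn.Module):
    """Scores a 32x32x32 volume as real or generated (returns a logit per sample)."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv3d(1, 64, 4, 2, 1), nn.LeakyReLU(0.2, True),    # -> 16^3
            nn.Conv3d(64, 128, 4, 2, 1), nn.LeakyReLU(0.2, True),  # -> 8^3
            nn.Conv3d(128, 256, 4, 2, 1), nn.LeakyReLU(0.2, True), # -> 4^3
            nn.Conv3d(256, 1, 4, 1, 0),                            # -> 1^3 logit
        )

    def forward(self, x):
        return self.net(x).view(x.size(0))

def train_step(G, D, opt_g, opt_d, real, latent_dim=128):
    """One adversarial update on a batch of real volumes of shape (B, 1, 32, 32, 32)."""
    bce = nn.BCEWithLogitsLoss()
    z = torch.randn(real.size(0), latent_dim)
    fake = G(z)

    # Discriminator update: push real volumes toward label 1, generated ones toward 0.
    opt_d.zero_grad()
    d_loss = bce(D(real), torch.ones(real.size(0))) + \
             bce(D(fake.detach()), torch.zeros(real.size(0)))
    d_loss.backward()
    opt_d.step()

    # Generator update: non-saturating loss, i.e. make the discriminator output 1 on fakes.
    opt_g.zero_grad()
    g_loss = bce(D(fake), torch.ones(real.size(0)))
    g_loss.backward()
    opt_g.step()
    return d_loss.item(), g_loss.item()
```

Real volumetric GANs surveyed in the review typically extend this skeleton with domain-specific losses (e.g., reconstruction or perceptual terms) and evaluation metrics beyond the adversarial objective.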


[P] Exploring Typefaces with Generative Adversarial Networks

#artificialintelligence

It seems plausible to me that the neural networks in our brain are similar to classification networks, which map inputs into a reduced, learned representation space. If so, it may be that psychedelics cause those networks to map inputs to slightly adjacent areas of the learned representation, producing hallucinations that are perceptually adjacent to the inputs. One idea I've seen gaining traction in recent years, with some small (but quickly growing) evidence behind it and the support of well-known people in the field (David E. Nichols and Dr Robin Carhart-Harris), is that psychedelics change larger-scale networks of networks like the default mode network. The idea is that psychedelics can increase or decrease signals through these networks, and that routing signals through paths that aren't normally used for that purpose would explain a lot of the basic effects. It could also explain why you see things in greater detail on psychedelics (I can't find it now, but David E. Nichols went through this in a presentation before; I believe he showed that a lot of visual data is normally thrown out at the end of the network path, and that psychedelics stop it being discarded so it instead reaches the conscious parts).


Scientific Center visitors interact with Sophia robot - Kuwait Times

#artificialintelligence

KUWAIT: The Kuwait Foundation for the Advancement of Sciences (KFAS) is eager to restore scientific culture and reach a wide audience, said KFAS' deputy director general for support programs and functions Amani Al-Baddah. Speaking on the sidelines of a lecture on the 'Space Month' held at the KFAS' Scientific Center, she said this event is part of a series, called KFAS Links, in various scientific fields. David Hansen, the creator of Sophia the Robot, was selected to speak at the lecture, she noted. She pointed out that there is a wide segment of young people and school students interested in artificial intelligence in particular, and science and technology in general. She indicated that the lecture was a chance to educate youngsters on the relationship between arts and technology.


Unsupervised Learning: Foundations of Neural Computation

AI Magazine

Unsupervised Learning: Foundations of Neural Computation is a collection of 21 papers published in the journal Neural Computation in the 10-year period since its founding in 1989 by Terrence Sejnowski. Neural Computation has become the leading journal of its kind. The editors of the book are Geoffrey Hinton and Terrence Sejnowski, two pioneers in neural networks. The selected papers include some of the most influential titles of late, for example, "What Is the Goal of Sensory Coding" by David Field and "An Information-Maximization Approach to Blind Separation and Blind Deconvolution" by Anthony Bell and Terrence Sejnowski. The edited volume provides a sample of important works on unsupervised learning, which cut across the fields of